Sparse regularized local regression

Authors

  • Diego Vidaurre
  • Concha Bielza
  • Pedro Larrañaga
Abstract

We provide a Bayesian formulation of regularized local linear regression, combined with techniques for optimal bandwidth selection. The approach arises from the idea that only the covariates found to be relevant for the regression function should be considered by the kernel that defines the neighborhood of the point of interest; however, the regression function itself depends on that kernel. A maximum a posteriori joint estimate of the regression parameters is given. In addition, an alternative algorithm based on sampling techniques is developed for finding both the regression parameter distribution and the predictive distribution. © 2013 Elsevier B.V. All rights reserved.
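The interplay the abstract describes — a kernel neighborhood around the query point combined with a sparse local linear fit — can be illustrated with a minimal sketch. This is not the authors' Bayesian joint estimator: it fixes a single spherical Gaussian bandwidth `h` rather than selecting it, and solves the kernel-weighted ℓ1-penalized local fit by proximal gradient (ISTA). The function name and all parameters are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def local_sparse_fit(X, y, x0, h, lam, n_iter=500):
    """Kernel-weighted l1-penalized local linear fit at a query point x0.

    Model: y ~ a + b @ (x - x0), with Gaussian kernel weights of
    bandwidth h. The intercept a is unpenalized and equals the local
    prediction at x0; b is updated by ISTA (gradient + soft-threshold).
    """
    n, p = X.shape
    Z = X - x0                                     # covariates centered at x0
    w = np.exp(-np.sum(Z ** 2, axis=1) / (2.0 * h ** 2))
    w /= w.sum()                                   # normalized kernel weights
    Wz = Z * w[:, None]
    L = 2.0 * np.linalg.norm(Wz.T @ Z, 2) + 1e-12  # Lipschitz constant in b
    a, b = float(w @ y), np.zeros(p)
    for _ in range(n_iter):
        a = float(w @ (y - Z @ b))                 # exact intercept update
        r = y - a - Z @ b
        g = -2.0 * Wz.T @ r                        # gradient of weighted LS loss
        b = soft_threshold(b - g / L, lam / L)     # proximal (ISTA) step
    return a, b

# Example: only the first of three covariates drives the response.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
y = 2.0 * X[:, 0] + 0.1 * rng.standard_normal(200)
a, b = local_sparse_fit(X, y, x0=np.array([0.5, 0.0, 0.0]), h=0.5, lam=0.05)
```

In the example the soft-thresholding step zeroes the coefficients of the two irrelevant covariates, while the intercept `a` approximates the regression function at `x0` (here, 2 × 0.5 = 1).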


Similar articles

Local regularization assisted orthogonal least squares regression

A locally regularized orthogonal least squares (LROLS) algorithm is proposed for constructing parsimonious or sparse regression models that generalize well. By associating each orthogonal weight in the regression model with an individual regularization parameter, the ability for the orthogonal least squares model selection to produce a very sparse model with good generalization performance is g...


Nonconvex Sparse Logistic Regression with Weakly Convex Regularization

In this work we propose to fit a sparse logistic regression model by a weakly convex regularized nonconvex optimization problem. The idea is based on the finding that a weakly convex function as an approximation of the ℓ0 pseudo-norm is able to better induce sparsity than the commonly used ℓ1 norm. For a class of weakly convex sparsity-inducing functions, we prove the nonconvexity of the corres...


The picasso Package for Nonconvex Regularized M-estimation in High Dimensions in R

We describe an R package named picasso, which implements a unified framework of pathwise coordinate optimization for a variety of sparse learning problems (Sparse Linear Regression, Sparse Logistic Regression and Sparse Column Inverse Operator), combined with distinct active set identification schemes (truncated cyclic, greedy, randomized and proximal gradient selection). Besides, the package p...


L1 regularized projection pursuit for additive model learning

In this paper, we present an L1-regularized projection pursuit algorithm for additive model learning. Two new algorithms are developed, for regression and classification respectively: sparse projection pursuit regression and sparse Jensen-Shannon Boosting. The introduced L1-regularized projection pursuit encourages sparse solutions; thus our new algorithms are robust to overfitting and present be...


A multilevel framework for sparse optimization with application to inverse covariance estimation and logistic regression

Solving ℓ1-regularized optimization problems is common in the fields of computational biology, signal processing and machine learning. Such ℓ1 regularization is utilized to find sparse minimizers of convex functions. A well-known example is the LASSO problem, where the ℓ1 norm regularizes a quadratic function. A multilevel framework is presented for solving such ℓ1-regularized sparse optimizati...
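The LASSO problem named above — an ℓ1 norm regularizing a quadratic loss — is simple enough to sketch. This is a generic cyclic coordinate-descent solver, not the multilevel framework the abstract describes; the function names and data are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimize (1/2n)*||y - X @ b||^2 + lam*||b||_1 by cyclic
    coordinate descent. Assumes no column of X is identically zero."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature
    r = y.copy()                        # residual y - X @ b (b starts at 0)
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]                        # drop coordinate j
            rho = X[:, j] @ r / n                      # partial correlation
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]                        # restore coordinate j
    return b

# Example: only 2 of 10 covariates are truly active.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(100)
b = lasso_cd(X, y, lam=0.1)
```

The soft-thresholding step sets the coefficients of the eight inactive covariates to (near) zero, recovering the sparse support, at the cost of a small shrinkage bias on the two active coefficients.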



Journal:
  • Computational Statistics & Data Analysis

Volume 62, Issue -

Pages -

Publication date: 2013